A Unifying View of Message Passing algorithms for Gaussian MRFs

Author

  • Ananth Ranganathan
Abstract

We illustrate the relationship between message passing algorithms in Gaussian Markov random fields (GMRFs) and matrix factorization algorithms. Specifically, we show that message passing on trees is equivalent to Gaussian elimination, while Loopy Belief Propagation is equivalent to Gauss-Seidel relaxation. Similarly, recently introduced message passing algorithms, such as Extended Message Passing and the Embedded Subgraphs algorithm, are also shown to be equivalent to commonly known matrix methods. We describe efficient extensions to message passing algorithms based on exploiting these relationships.
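The Loopy BP / Gauss-Seidel correspondence can be made concrete: in a GMRF written in information form, Jμ = h with precision matrix J, iterating Gauss-Seidel updates recovers the marginal means exactly as the BP mean updates would. Below is a minimal sketch of that relaxation; the 3×3 precision matrix is an illustrative example (not from the paper), chosen to be diagonally dominant so the iteration converges.

```python
import numpy as np

# Illustrative 3-variable GMRF in information form: J mu = h,
# with J a sparse, symmetric positive-definite precision matrix.
J = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
h = np.array([1.0, 2.0, 3.0])

def gauss_seidel(J, h, iters=100):
    """Gauss-Seidel relaxation for J mu = h.

    Each update recomputes one variable's mean from the current values
    of its neighbors -- the same local computation a Loopy BP mean
    update performs on the corresponding factor graph."""
    mu = np.zeros_like(h)
    for _ in range(iters):
        for i in range(len(h)):
            # Residual using the latest values of all other variables.
            s = h[i] - J[i, :] @ mu + J[i, i] * mu[i]
            mu[i] = s / J[i, i]
    return mu

mu = gauss_seidel(J, h)
# For diagonally dominant J this converges to the exact means J^{-1} h.
```

On a tree-structured J, one forward/backward sweep corresponds to Gaussian elimination and terminates exactly; on loopy graphs the iteration above only converges under conditions such as diagonal dominance.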


Similar articles

Hinge-Loss Markov Random Fields and Probabilistic Soft Logic

This paper introduces hinge-loss Markov random fields (HL-MRFs), a new class of probabilistic graphical models particularly well-suited to large-scale structured prediction and learning. We derive HL-MRFs by unifying and then generalizing three different approaches to scalable inference in structured models: (1) randomized algorithms for MAX SAT, (2) local consistency relaxation for Markov rand...


Lifted Message Passing as Reparametrization of Graphical Models

Lifted inference approaches can considerably speed up probabilistic inference in Markov random fields (MRFs) with symmetries. Given evidence, they essentially form a lifted, i.e., reduced factor graph by grouping together indistinguishable variables and factors. Typically, however, lifted factor graphs are not amenable to off-the-shelf message passing (MP) approaches, and hence requires one to u...


Unifying Local Consistency and MAX SAT Relaxations for Scalable Inference with Rounding Guarantees

We prove the equivalence of first-order local consistency relaxations and the MAX SAT relaxation of Goemans and Williamson (1994) for a class of MRFs we refer to as logical MRFs. This allows us to combine the advantages of each into a single MAP inference technique: solving the local consistency relaxation with any of a number of highly scalable message-passing algorithms, and then obtaining a ...


Message passing with l1 penalized KL minimization

Bayesian inference is often hampered by large computational expense. As a generalization of belief propagation (BP), expectation propagation (EP) approximates exact Bayesian computation with efficient message passing updates. However, when an approximation family used by EP is far from exact posterior distributions, message passing may lead to poor approximation quality and suffer from divergen...


Convergent message passing algorithms - a unifying view

Message-passing algorithms have emerged as powerful techniques for approximate inference in graphical models. When these algorithms converge, they can be shown to find local (or sometimes even global) optima of variational formulations to the inference problem. But many of the most popular algorithms are not guaranteed to converge. This has led to recent interest in convergent message-passing ...




Publication date: 2007